A Modified Rank One Update Which Converges Q-Superlinearly
Author
Abstract
Quasi-Newton methods are generally held to be the most efficient minimization methods for small to medium sized problems. Among these, the symmetric rank one update of Broyden [4] was disregarded for a long time because of its potential failure. The work of Conn, Gould and Toint [6], Kelley and Sachs [13] and Khalfan, Byrd and Schnabel [14], [15] has renewed interest in this method. However, the question of boundedness of the generated matrix sequence has not been resolved by this work. In the present paper it is shown that a slightly modified version of this update generates bounded updates and converges superlinearly for uniformly convex functions. Numerical results support these theoretical considerations.
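The abstract does not reproduce the modification itself; for orientation only, the following is a minimal Python sketch of the classical symmetric rank one (SR1) update with the commonly used skipping safeguard, which is one standard way to keep the generated matrices from blowing up. The function name sr1_update and the tolerance r are illustrative assumptions, not the paper's actual modification.

```python
import numpy as np

def sr1_update(B, s, y, r=1e-8):
    """One safeguarded symmetric rank-one (SR1) update (illustrative sketch).

    B : current symmetric Hessian approximation
    s : step x_{k+1} - x_k
    y : gradient difference g_{k+1} - g_k
    r : skipping tolerance (assumed value, not from the paper)
    """
    v = y - B @ s
    denom = v @ s
    # Skip the update when the denominator is tiny relative to
    # ||s|| * ||y - Bs||; this is the usual safeguard against
    # unbounded updates, not the modification proposed in the paper.
    if abs(denom) < r * np.linalg.norm(s) * np.linalg.norm(v):
        return B
    return B + np.outer(v, v) / denom
```

In a quasi-Newton loop this would be called once per iteration with s equal to the accepted step and y equal to the corresponding change in the gradient.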
Similar resources
Superlinearly convergent exact penalty projected structured Hessian updating schemes for constrained nonlinear least squares: asymptotic analysis
We present a structured algorithm for solving constrained nonlinear least squares problems, and establish its local two-step Q-superlinear convergence. The approach is based on an adaptive structured scheme due to Mahdavi-Amiri and Bartels of the exact penalty method of Coleman and Conn for nonlinearly constrained optimization problems. The structured adaptation also makes use of the ideas of N...
A path following method for LCP with superlinearly convergent iteration sequence
A new algorithm for solving linear complementarity problems with sufficient matrices is proposed. If the problem has a solution the algorithm is superlinearly convergent from any positive starting points, even for degenerate problems. Each iteration requires only one matrix factorization and at most two backsolves. Only one backsolve is necessary if the problem is known to be nondegenerate. The a...
Piecewise line-search techniques for constrained minimization by quasi-Newton algorithms
Defining a consistent technique for maintaining the positive definiteness of the matrices generated by quasi-Newton methods for equality constrained minimization remains a difficult open problem. In this paper, we review and discuss the results obtained with a new technique based on an extension of the Wolfe line-search used in unconstrained optimization. The idea is to follow a piecewise linea...
Superlinear Convergence of an Interior-point Method for Monotone Variational Inequalities
We describe an infeasible-interior-point algorithm for monotone variational inequality problems and prove that it converges globally and superlinearly under standard conditions plus a constant rank constraint qualification. The latter condition represents a relaxation of the two types of assumptions made in existing superlinear analyses; namely, linearity of the constraints and linear independence...
Convergence of Alternating Least Squares Optimisation for Rank-One Approximation to High Order Tensors
The approximation of tensors has important applications in various disciplines, but it remains an extremely challenging task. It is well known that tensors of higher order can fail to have best low-rank approximations, with the important exception that best rank-one approximations always exist. The most popular approach to low-rank approximation is the alternating least squares (ALS) method...
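As a brief illustration of the ALS idea mentioned in this snippet, a minimal Python sketch of alternating least squares for the best rank-one approximation of a third-order tensor might look as follows; the function name, the random initialization, and the fixed iteration count are assumptions made for illustration, not details from the cited paper.

```python
import numpy as np

def rank_one_als(T, iters=50):
    """ALS sketch: approximate a 3rd-order tensor T by lam * a (x) b (x) c.

    Each factor is updated in turn while the other two are held fixed;
    every subproblem reduces to a tensor-vector contraction followed by
    normalization (the higher-order power method).
    """
    a = np.random.randn(T.shape[0]); a /= np.linalg.norm(a)
    b = np.random.randn(T.shape[1]); b /= np.linalg.norm(b)
    c = np.random.randn(T.shape[2]); c /= np.linalg.norm(c)
    lam = 0.0
    for _ in range(iters):
        a = np.einsum('ijk,j,k->i', T, b, c); a /= np.linalg.norm(a)
        b = np.einsum('ijk,i,k->j', T, a, c); b /= np.linalg.norm(b)
        c = np.einsum('ijk,i,j->k', T, a, b)
        lam = np.linalg.norm(c); c /= lam
    return lam, a, b, c

# Example usage on a random tensor:
# T = np.random.randn(4, 5, 6)
# lam, a, b, c = rank_one_als(T)
```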
Journal: Comp. Opt. and Appl.
Volume 19, Issue -
Pages -
Year of publication: 2001